Learning Sentence-Level Representations with Predictive Coding

Abstract

Learning sentence representations is an essential and challenging topic in the deep learning and natural language processing communities. Recent methods pre-train big models on massive text corpora, focusing mainly on the representation of contextualized words. As a result, these models cannot generate informative sentence embeddings, since they do not explicitly exploit the structure and discourse relationships that exist between contiguous sentences. Drawing inspiration from human language processing, this work explores how to improve the sentence-level representations of pre-trained models by borrowing ideas from predictive coding theory. Specifically, we extend BERT-style models with bottom-up and top-down computation to predict future sentences in latent space at each intermediate layer of the network. We conduct extensive experimentation on various benchmarks for the English and Spanish languages, designed to assess sentence- and discourse-level representations as well as pragmatics-focused assessments. Our results show that our approach improves sentence representations consistently for both languages. Furthermore, the experiments also indicate that our models capture discourse and pragmatics knowledge. In addition, to validate the proposed method, we carried out an ablation study and a qualitative study, which verified that the predictive mechanism helps improve the quality of the representations.
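The per-layer prediction idea described above can be sketched in code. The following is a minimal toy illustration, not the authors' implementation: the encoder, the per-layer "top-down" heads, and all shapes and names are illustrative assumptions. At each layer, a small head predicts the latent state of the next sentence, and the sum of per-layer prediction errors would be added to the usual pre-training objective.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_layers = 16, 3  # toy hidden size and depth (illustrative)

# Toy stand-in for a BERT-style encoder: each layer is a linear map + tanh.
layer_weights = [rng.normal(scale=0.1, size=(d, d)) for _ in range(n_layers)]
# One predictive head per layer, mapping that layer's state to a guess
# of the NEXT sentence's state at the same layer (hypothetical design).
pred_heads = [rng.normal(scale=0.1, size=(d, d)) for _ in range(n_layers)]

def encode(x):
    """Return the hidden state of sentence vector x at every layer."""
    states, h = [], x
    for W in layer_weights:
        h = np.tanh(h @ W)
        states.append(h)
    return states

def predictive_coding_loss(sent_a, sent_b):
    """Sum of per-layer MSEs between each head's prediction from
    sentence A and the actual latent of the following sentence B."""
    states_a, states_b = encode(sent_a), encode(sent_b)
    loss = 0.0
    for h_a, h_b, P in zip(states_a, states_b, pred_heads):
        predicted_next = h_a @ P  # top-down prediction of the next sentence
        loss += float(np.mean((predicted_next - h_b) ** 2))
    return loss

# Two consecutive "sentences" as toy embedding vectors.
s1, s2 = rng.normal(size=d), rng.normal(size=d)
print(predictive_coding_loss(s1, s2))
```

In a real setting this auxiliary loss would be minimized jointly with the masked-language-modeling objective, so that intermediate layers are pushed to encode information predictive of upcoming discourse.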


Similar Articles

Learning Visually Grounded Sentence Representations

We introduce a variety of models, trained on a supervised image captioning corpus to predict the image features for a given caption, to perform sentence representation grounding. We train a grounded sentence encoder that achieves good performance on COCO caption and image retrieval and subsequently show that this encoder can successfully be transferred to various NLP tasks, with improved perfor...


Learning Predictive State Representations

We introduce the first algorithm for learning predictive state representations (PSRs), which are a way of representing the state of a controlled dynamical system. The state representation in a PSR is a vector of predictions of tests, where tests are sequences of actions and observations said to be true if and only if all the observations occur given that all the actions are taken. The problem of ...


Learning Word Representations with Hierarchical Sparse Coding

We propose a new method for learning word representations using hierarchical regularization in sparse coding inspired by the linguistic study of word meanings. We show an efficient learning algorithm based on stochastic proximal methods that is significantly faster than previous approaches, making it possible to perform hierarchical sparse coding on a corpus of billions of word tokens. Experime...


A predictive coding framework for rapid neural dynamics during sentence-level language comprehension.

There is a growing literature investigating the relationship between oscillatory neural dynamics measured using electroencephalography (EEG) and/or magnetoencephalography (MEG), and sentence-level language comprehension. Recent proposals have suggested a strong link between predictive coding accounts of the hierarchical flow of information in the brain, and oscillatory neural dynamics in the be...


A Predictive Coding Perspective on Beta Oscillations during Sentence-Level Language Comprehension

Oscillatory neural dynamics have been steadily receiving more attention as a robust and temporally precise signature of network activity related to language processing. We have recently proposed that oscillatory dynamics in the beta and gamma frequency ranges measured during sentence-level comprehension might be best explained from a predictive coding perspective. Under our proposal we related ...



Journal

Journal title: Machine Learning and Knowledge Extraction

Year: 2023

ISSN: 2504-4990

DOI: https://doi.org/10.3390/make5010005